Powering Intelligence: The Energy Challenge Behind The AI Revolution
Diganta Sengupta is a seasoned technology leader with deep expertise in artificial intelligence, generative AI, cloud computing and blockchain.
The rise of generative AI, and of large language models (LLMs) in particular, represents a turning point in our digital evolution. These models, capable of producing human-like text, making decisions on people's behalf and sometimes even accelerating the pace of science, are transforming industries at a breathtaking rate.
But with this transformation comes one massive and often overlooked consequence—our skyrocketing thirst for electricity.
The Energy Demands Of Intelligence
A recent IEEE PES technical report indicates that investment in new data centers has dramatically increased. This growth, partly driven by generative AI workloads and the broader digital economy, means data centers are expected to account for up to 44% of total U.S. electricity load growth over the next three to five years.
According to the report, some experts project that data centers may consume 9% to 12% of all U.S. electricity by 2028. To put that in perspective at the level of a single AI task: A ChatGPT query uses around 2.9 watt-hours (Wh) of energy, roughly 10 times that of a typical Google search. Multiply that by the billions of queries made daily around the globe, and the totals become enormous.
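The scale of that multiplication is easy to check with back-of-envelope arithmetic. The sketch below uses the per-query figures cited above; the daily query volume and the per-home consumption figure are illustrative assumptions, not reported statistics.

```python
# Back-of-envelope estimate of daily AI query energy.
# Per-query figure (2.9 Wh) is cited above; the daily query volume
# and average household usage are assumptions for illustration only.

WH_PER_CHATGPT_QUERY = 2.9                          # watt-hours per query
WH_PER_GOOGLE_SEARCH = WH_PER_CHATGPT_QUERY / 10    # per the ~10x ratio above

ASSUMED_DAILY_QUERIES = 1_000_000_000               # hypothetical: 1 billion/day

daily_mwh = ASSUMED_DAILY_QUERIES * WH_PER_CHATGPT_QUERY / 1e6  # Wh -> MWh
print(f"Daily energy at 1B queries: {daily_mwh:,.0f} MWh")

# Assuming an average U.S. home uses roughly 30 kWh per day:
homes_equivalent = daily_mwh * 1000 / 30
print(f"Comparable to the daily usage of ~{homes_equivalent:,.0f} homes")
```

Even at these conservative assumptions, a single day of queries lands in the thousands of megawatt-hours, which is why the aggregate numbers matter more than any one prompt.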
More alarming, the scale of new hyperscale and AI-optimized facilities keeps growing. Data center developers are now planning campuses of 500 MW to 2 GW, the equivalent of whole cities in terms of electrical demand. For context, a 1 GW load draws as much electricity as 800,000 to 1 million U.S. homes. These data centers are not just another category of demand; they are increasingly becoming the backbone of the 21st-century tech industry.
The Grid Is Feeling The Pressure
The electric grid, already straining under multiple electrification trends (EVs, smart cities, renewable integration and more), now faces a challenge with an additional twist: Data centers present loads that are enormous, volatile and highly location-specific.
In this landscape, forecasting energy needs is difficult. Commercially confidential plans, fast-changing chip efficiency trends, self-generation and a regulatory landscape that varies by jurisdiction all cloud the picture. Some organizations' forecasts suggest certain coastal areas will need 5% of their local grid capacity for data centers, while others expect rates of up to 15% within just two or three years.
It's also worth noting that our substations and transmission lines were never built for AI-era demand peaks. A single large data center can swamp local infrastructure, requiring years of permitting and construction to upgrade.
Grid interconnection applications have also risen sharply. In regions such as Texas and PJM territory, I've seen data centers drive new levels of reliability planning and infrastructure coordination.
It is also possible that this thirst for digital intelligence will thwart decarbonization. As energy needs grow, the goal of reducing greenhouse gas emissions may become unattainable—unless AI growth champions energy sources that are clean and reliable.
Digital Intelligence, Physical Fragility
According to a recent issue of IEEE's Energy Sustainability Magazine, we must view AI not only as a new tool but also as a driving force behind energy consumption. The AI community must realize that it can shape demand sustainably.
Although I applaud AI's role in optimizing grids, reducing waste and bringing social infrastructure online, I believe we must also ask whether all this resource consumption is justified. Data centers not only stress power systems; they also consume large volumes of water for cooling and generate significant electronic waste as hardware becomes obsolete and cannot be recycled.
This intersection of digital and physical systems demands a new model: We need to move from disconnected planning to an integrated approach that balances sustainability, reliability and computing capacity.
So, What Can We Do?
As an industry professional leading multiple AI, cloud and infrastructure transformation projects for clients across energy, utilities and public sector organizations, I believe the way forward must include the following five imperatives:
1. Grid-Aware AI Design: We should align AI rollouts with low-carbon infrastructure so that LLMs are trained and deployed at times when renewable energy is abundant and grids are least stressed.
2. Behind-The-Meter Generation: We need to build clean energy resources (solar, hydrogen, etc.) alongside data centers to reduce grid impacts while advancing corporate ESG goals. However, this kind of design requires policy clarity and is not without reliability risks.
3. Flexible AI Workloads: Some AI jobs, such as model training and batch inference, are not time-critical and can be shifted to off-peak hours or lower-carbon regions. We should keep this flexibility in mind when integrating AI with grid assets, remembering that this technology can be both an asset and a liability.
4. Decentralized Edge Intelligence: By doing more computation on edge devices closer to where data is generated, we can cut backhaul traffic, latency and energy use while paving the way for more resilient mini-grids and modular energy systems.
5. Cross-Sectoral Cooperation: I believe the whole industry needs to pull together: utilities, large cloud operators, government regulators and universities. This isn't simply an engineering problem; it's a question of governance and policy that will shape our digital future.
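Imperatives one and three can be made concrete with a small scheduling sketch: given an hourly forecast of grid carbon intensity, a deferrable AI job (say, a training run) is placed in the cleanest contiguous window. The forecast values below are hypothetical; in practice they would come from a grid operator or a carbon-intensity data service.

```python
# Minimal sketch of grid-aware scheduling for a deferrable AI workload:
# choose the contiguous window with the lowest average carbon intensity.

def best_window(forecast_g_per_kwh, job_hours):
    """Return (start_hour, avg_intensity) of the lowest-carbon window."""
    best_start, best_avg = 0, float("inf")
    for start in range(len(forecast_g_per_kwh) - job_hours + 1):
        window = forecast_g_per_kwh[start:start + job_hours]
        avg = sum(window) / job_hours
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start, best_avg

# Hypothetical 24-hour forecast (gCO2/kWh), cleaner at midday as solar peaks.
forecast = [450, 440, 430, 420, 410, 400, 380, 340,
            280, 220, 180, 150, 140, 150, 190, 250,
            320, 380, 420, 450, 470, 480, 470, 460]

start, avg = best_window(forecast, job_hours=4)
print(f"Schedule the 4-hour job at hour {start} (avg {avg:.0f} gCO2/kWh)")
```

In this toy forecast, the scheduler lands the job in the midday solar peak; a real deployment would also weigh grid stress, electricity price and deadline constraints.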
It's Time For The AI Ecosystem To Wake Up
Over the last decade, we have optimized our systems for algorithmic efficiency. Now it is time to optimize for energy efficiency, grid stability and social resilience.
The AI we are building is powerful, but that power does not come free, in the literal sense. Remember that every intelligent prompt, video stream and AI-generated insight affects the environment. As engineers, architects and practitioners of AI, it is our responsibility to ensure this intelligence is not just transformative but sustainable.
Forbes Technology Council is an invitation-only community for world-class CIOs, CTOs and technology executives.